---
title: "Adaptive Multilayer Perceptron"
output:
flexdashboard::flex_dashboard:
storyboard: true
social: menu
source: embed
---
```{r setup, include=FALSE}
library(flexdashboard)
```
### Presentation.

### Introduction.

***
1. Artificial Neural Networks
- The neural network itself is not an algorithm, but rather a framework for many different machine learning algorithms to work together and process complex data inputs.
### Introduction.

***
1. Multilayer Perceptron
- A Multilayer Perceptron is a class of feedforward artificial neural network.
### Introduction.

***
1. Deep Neural Networks:
- Computer vision;
- Speech recognition;
- Natural language processing.
### Introduction.

***
- Convolutional Neural Network (CNN)
### Introduction.

***
1. GoogleNet
### Introduction.

***
1. Long Short-Term Memory (LSTM)
- LSTMs are a special kind of RNN, capable of learning long-term dependencies.
### Motivation.

***
1. Deep Learning Challenges
- Best architecture for a specific task
- Unsupervised training
- Generalization for many tasks
2. Neuro Evolution
- Neuroevolution is a subfield of artificial intelligence (AI) and machine learning (ML) that tries to trigger, inside a computer, an evolutionary process similar to the one that produced our brains. In other words, neuroevolution seeks to evolve neural networks through evolutionary algorithms.
### Emergent Neural Networks.

### Emergent Multilayer Perceptron.

***
1. Our Proposal.
- Give the neural network the possibility of changing the number of neurons during the learning process;
- Find the best model;
- Increase learning accuracy;
- Decrease the error;
- Decrease training time.
### Multilayer Perceptron

### Forward Propagation

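A minimal sketch of a forward pass for a one-hidden-layer MLP with sigmoid activations; the sizes, weights, and names here are illustrative, not the configuration used in this work.

```{r}
sigmoid <- function(z) 1 / (1 + exp(-z))

# Forward pass: input -> hidden -> output
forward <- function(x, W1, b1, W2, b2) {
  h <- sigmoid(W1 %*% x + b1)  # hidden-layer activations
  y <- sigmoid(W2 %*% h + b2)  # output-layer activations
  list(h = h, y = y)
}

set.seed(1)
W1 <- matrix(rnorm(3 * 4), 3, 4)  # 4 inputs -> 3 hidden neurons
b1 <- rnorm(3)
W2 <- matrix(rnorm(2 * 3), 2, 3)  # 3 hidden -> 2 outputs
b2 <- rnorm(2)
x   <- as.numeric(iris[1, 1:4])   # one iris sample as input
out <- forward(x, W1, b1, W2, b2)
```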
### Gradient Descent.

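Gradient descent can be illustrated on a simple quadratic loss; this toy example (all values illustrative) repeatedly applies the update rule until the parameters approach the minimizer.

```{r}
loss <- function(w) sum((w - c(1, 2))^2)  # minimum at w = (1, 2)
grad <- function(w) 2 * (w - c(1, 2))     # gradient of the loss

w   <- c(0, 0)  # starting point
eta <- 0.1      # learning rate
for (i in 1:100) {
  w <- w - eta * grad(w)  # w_{t+1} = w_t - eta * grad(w_t)
}
w  # close to the minimizer (1, 2)
```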
### Back Propagation.

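A hand-sized backpropagation step for a one-hidden-layer network with squared-error loss and sigmoid activations; all numbers are toy values chosen for illustration.

```{r}
sigmoid <- function(z) 1 / (1 + exp(-z))

x      <- c(0.5, 0.1)    # toy input
target <- 1              # toy target output
W1 <- matrix(0.1, 2, 2)  # input -> hidden weights
W2 <- matrix(0.1, 1, 2)  # hidden -> output weights

h <- sigmoid(W1 %*% x)   # forward pass
y <- sigmoid(W2 %*% h)

delta2 <- (y - target) * y * (1 - y)        # output error term
delta1 <- (t(W2) %*% delta2) * h * (1 - h)  # error propagated to hidden layer
gW2 <- delta2 %*% t(h)   # gradient for W2
gW1 <- delta1 %*% t(x)   # gradient for W1
```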
### Back Propagation.

### Back Propagation.

### Component-based MLP.

***
1. Components
### Adaptation.

***
- The weight matrices are preserved when switching between variants;
- Their dimensions must be checked and reshaped to match each variant's number of neurons.
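The resizing step can be sketched as follows: keep the learned weights that still fit the new shape and initialize the rest with small random values. `resize_weights` is an illustrative helper, not the actual implementation used in this work.

```{r}
# Keep existing weights where the old and new shapes overlap,
# initialize any new rows/columns with small random values.
resize_weights <- function(W, new_rows, new_cols) {
  out <- matrix(rnorm(new_rows * new_cols, sd = 0.01), new_rows, new_cols)
  r <- min(nrow(W), new_rows)
  c <- min(ncol(W), new_cols)
  out[1:r, 1:c] <- W[1:r, 1:c]
  out
}

W     <- matrix(1, 3, 4)         # variant with 3 hidden neurons, 4 inputs
W_big <- resize_weights(W, 5, 4) # variant with 5 hidden neurons
```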
### Perceptron.

### Learning.

***
1. getPerceptionData() : Learning Accuracy
2. setConfig(last_config) : Hidden Layer with different number of neurons
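How these two calls might fit together in an adaptation loop: only `getPerceptionData()` and `setConfig()` come from the interface above; the loop and `candidate_neuron_counts` are a hypothetical sketch (not evaluated here).

```{r, eval=FALSE}
best <- NULL
for (n in candidate_neuron_counts) {  # candidate hidden-layer sizes (assumed)
  setConfig(last_config = n)          # hidden layer with n neurons
  acc <- getPerceptionData()          # learning accuracy of this variant
  if (is.null(best) || acc > best$acc) {
    best <- list(neurons = n, acc = acc)
  }
}
```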
### Demo.

### Data.
```{r}
pairs(iris[1:4],
      main = "Anderson's Iris Data -- 3 species",
      pch = 21,
      bg = c("red", "green3", "blue")[unclass(iris$Species)],
      lower.panel = NULL,
      labels = c("SL", "SW", "PL", "PW"),
      font.labels = 2, cex.labels = 4.5)
```
***
Iris Dataset

1. It includes three iris species with 50 samples each, as well as some properties of each flower. One species is linearly separable from the other two, but those two are not linearly separable from each other.
2. The columns in this dataset are:
- SepalLength
- SepalWidth
- PetalLength
- PetalWidth
- Class:
  - Iris Setosa
  - Iris Versicolour
  - Iris Virginica
### The Code.

### Questions.
